Nearest Neighbor Regression with Heavy-Tailed Errors

Authors

Abstract


Related articles

Nonparametric quantile regression with heavy-tailed and strongly dependent errors

We consider nonparametric estimation of the conditional qth quantile for stationary time series. We deal with stationary time series with strong time dependence and heavy tails under the setting of random design. We estimate the conditional qth quantile by local linear regression and investigate the asymptotic properties. It is shown that the asymptotic properties are affected by both the time ...
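For intuition, here is a minimal sketch of local linear conditional-quantile estimation of the kind described above, assuming an i.i.d. random-design sample, a Gaussian kernel, and the standard check loss; the function names, bandwidth, and optimizer are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(u, q):
    """Quantile (check) loss rho_q(u) = u * (q - 1{u < 0})."""
    return u * (q - (u < 0))

def local_linear_quantile(x0, X, Y, q=0.5, h=0.5):
    """Estimate the conditional q-th quantile at x0 by local linear fitting.

    Minimizes sum_i K((X_i - x0)/h) * rho_q(Y_i - a - b*(X_i - x0))
    over (a, b); the intercept a is the quantile estimate at x0.
    """
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)          # Gaussian kernel weights

    def objective(theta):
        a, b = theta
        return np.sum(w * check_loss(Y - a - b * (X - x0), q))

    res = minimize(objective, x0=[np.median(Y), 0.0], method="Nelder-Mead")
    return res.x[0]

# toy example: heavy-tailed (Student-t) noise around a sine curve
rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, 500)
Y = np.sin(X) + rng.standard_t(df=2, size=500)
print(local_linear_quantile(np.pi / 2, X, Y, q=0.5, h=0.4))
```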


Unsupervised K-Nearest Neighbor Regression

In many scientific disciplines, structures in high-dimensional data have to be found, e.g., in stellar spectra, in genome data, or in face recognition tasks. In this work we present a novel approach to non-linear dimensionality reduction. It is based on fitting K-nearest neighbor regression to the unsupervised regression framework for learning low-dimensional manifolds. Similar to related appr...
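As a rough illustration of the unsupervised K-nearest neighbor regression idea, the sketch below evaluates how well K-NN regression from candidate low-dimensional latent coordinates reconstructs the observed high-dimensional data; the leave-one-out objective, names, and toy data are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def knn_reconstruction_error(latent, data, k=5):
    """Leave-one-out error of reconstructing high-dimensional `data` from
    low-dimensional `latent` positions with K-NN regression; this is the
    kind of objective an unsupervised K-NN regression method would
    minimize over the latent coordinates."""
    n = len(data)
    err = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        model = KNeighborsRegressor(n_neighbors=k).fit(latent[mask], data[mask])
        err += np.sum((model.predict(latent[i:i + 1]) - data[i]) ** 2)
    return err / n

# toy data: noisy 3-D points near a 1-D curve; an ordered latent layout
# should reconstruct the data much better than a random one
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 1, 200))
data = np.c_[np.cos(4 * t), np.sin(4 * t), t] + 0.01 * rng.normal(size=(200, 3))
print(knn_reconstruction_error(rng.uniform(0, 1, (200, 1)), data))  # random latent
print(knn_reconstruction_error(t.reshape(-1, 1), data))             # ordered latent
```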


Fractional Regression Nearest Neighbor Imputation

Sample surveys typically gather information on a sample of units from a finite population and assign survey weights to the sampled units. Surveys frequently have missing values for some variables for some units. Fractional regression imputation creates multiple values for each missing value by adding randomly selected empirical residuals to predicted values. Fractional imputation methods assign ...
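A minimal sketch of fractional regression imputation as described above, assuming a single continuous study variable, a linear working regression, and equal fractional weights 1/m; all names and defaults are illustrative.

```python
import numpy as np

def fractional_regression_impute(x, y, miss, m=5, rng=None):
    """Fractional regression imputation for a single y-variable.

    Fits a linear regression of y on x using the respondents, then creates
    m imputed values per nonrespondent by adding m randomly drawn empirical
    residuals to the predicted value; each imputed value carries a
    fractional weight of 1/m.
    """
    rng = rng or np.random.default_rng()
    obs = ~miss
    X = np.c_[np.ones(len(x)), x]
    beta, *_ = np.linalg.lstsq(X[obs], y[obs], rcond=None)
    resid = y[obs] - X[obs] @ beta                    # empirical residuals
    imputations = []
    for i in np.where(miss)[0]:
        draws = rng.choice(resid, size=m, replace=True)
        for r in draws:
            imputations.append((i, X[i] @ beta + r, 1.0 / m))  # (unit, value, fraction)
    return imputations

# toy example: 20% of y missing completely at random
rng = np.random.default_rng(2)
x = rng.normal(size=100)
y = 2.0 + 1.5 * x + rng.normal(size=100)
miss = rng.uniform(size=100) < 0.2
print(fractional_regression_impute(x, y, miss, m=5, rng=rng)[:3])
```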


Heavy-Tailed Symmetric Stochastic Neighbor Embedding

Stochastic Neighbor Embedding (SNE) has been shown to be quite promising for data visualization. Currently, the most popular implementation, t-SNE, is restricted to a particular Student t-distribution as its embedding distribution. Moreover, it uses a gradient descent algorithm that may require users to tune parameters such as the learning step size, momentum, etc., in finding its optimum. In this p...
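The sketch below illustrates the general idea of replacing the fixed Student t embedding kernel with a tunable heavy-tailed family; the power-family parametrization used here is an assumption for illustration and may differ from the paper's exact formulation.

```python
import numpy as np

def heavy_tailed_affinities(Y, alpha=1.0):
    """Symmetric pairwise affinities q_ij with a heavy-tailed kernel.

    Uses the power family (alpha * d_ij^2 + 1)^(-1/alpha): alpha = 1 gives the
    Student t kernel used by t-SNE, and alpha -> 0 approaches a Gaussian.
    (Illustrative parametrization, not necessarily the paper's exact one.)
    """
    d2 = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)  # squared distances
    w = (alpha * d2 + 1.0) ** (-1.0 / alpha)
    np.fill_diagonal(w, 0.0)                                    # no self-affinity
    return w / w.sum()

def kl_divergence(P, Q, eps=1e-12):
    """Embedding objective: KL(P || Q) over pairwise affinities."""
    mask = P > 0
    return np.sum(P[mask] * np.log((P[mask] + eps) / (Q[mask] + eps)))

# toy usage: P from 10-D data (near-Gaussian kernel, no per-point bandwidth),
# Q from a small random 2-D embedding with the heavy-tailed kernel; this KL
# is the quantity an embedding method would minimize over the 2-D positions
rng = np.random.default_rng(3)
X = rng.normal(size=(50, 10))
Y = 1e-2 * rng.normal(size=(50, 2))
P = heavy_tailed_affinities(X, alpha=0.01)
Q = heavy_tailed_affinities(Y, alpha=1.0)
print(kl_divergence(P, Q))
```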


Discriminant Adaptive Nearest Neighbor Classification and Regression

Robert Tibshirani, Department of Statistics, University of Toronto, tibs@utstat.toronto.edu. Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to try to finesse this curse of dimensionality. We use a local linear discriminant analysis to e...
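A rough sketch of the locally adaptive idea: estimate within- and between-class scatter in a neighborhood of the query point and combine them into a local metric for the final nearest-neighbor vote. The exact combination, the eps regularizer, and the neighborhood sizes below are illustrative assumptions, not necessarily the paper's estimator.

```python
import numpy as np
from scipy.linalg import sqrtm
from collections import Counter

def dann_metric(X, y, x0, k_local=50, eps=1.0):
    """Locally adaptive metric in the spirit of discriminant adaptive NN.

    Estimates within-class (W) and between-class (B) scatter from the
    k_local points nearest to x0, then forms
        Sigma = W^{-1/2} [ W^{-1/2} B W^{-1/2} + eps * I ] W^{-1/2},
    which stretches distances along the local discriminant directions.
    """
    idx = np.argsort(np.sum((X - x0) ** 2, axis=1))[:k_local]
    Xl, yl = X[idx], y[idx]
    mean = Xl.mean(axis=0)
    p = X.shape[1]
    W = np.zeros((p, p))
    B = np.zeros((p, p))
    for c in np.unique(yl):
        Xc = Xl[yl == c]
        mc = Xc.mean(axis=0)
        W += (Xc - mc).T @ (Xc - mc)
        B += len(Xc) * np.outer(mc - mean, mc - mean)
    W_inv_sqrt = np.linalg.inv(sqrtm(W / len(Xl) + 1e-6 * np.eye(p)).real)
    B = B / len(Xl)
    return W_inv_sqrt @ (W_inv_sqrt @ B @ W_inv_sqrt + eps * np.eye(p)) @ W_inv_sqrt

def dann_predict(X, y, x0, k=5, **kw):
    """Classify x0 by majority vote among k neighbors under the local metric."""
    S = dann_metric(X, y, x0, **kw)
    d = np.einsum('ij,jk,ik->i', X - x0, S, X - x0)   # (x - x0)^T S (x - x0)
    votes = y[np.argsort(d)[:k]]
    return Counter(votes).most_common(1)[0][0]

# toy usage: two overlapping Gaussian classes in 5 dimensions
rng = np.random.default_rng(4)
X = np.r_[rng.normal(0, 1, (200, 5)), rng.normal(0.8, 1, (200, 5))]
y = np.r_[np.zeros(200, dtype=int), np.ones(200, dtype=int)]
print(dann_predict(X, y, np.zeros(5), k=5))
```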



Journal

Journal title: The Annals of Statistics

Year: 1993

ISSN: 0090-5364

DOI: 10.1214/aos/1176349144